An n-th section line search in conjugate gradient method for small-scale unconstrained optimization

Authors
Abstract


Similar Articles

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem, and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...
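Since the excerpt above does not give the closed form of the proposed conjugacy parameter, the following Python sketch shows only a generic nonlinear conjugate gradient loop: `beta_rule` is a hypothetical placeholder for the paper's parameter (the classical Fletcher-Reeves formula is used purely for illustration), and `backtracking` is a simple Armijo line search standing in for whatever line search the authors actually use.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule, max_iter=200, tol=1e-6):
    # Generic nonlinear CG loop; beta_rule(g_new, g_old, d_old) -> float is a
    # placeholder for the paper's parameter, which the excerpt does not give.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = backtracking(f, grad, x, d)   # step length from a line search
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = beta_rule(g_new, g, d)         # conjugacy parameter
        d = -g_new + beta * d                 # CG direction update
        x, g = x_new, g_new
    return x

def fletcher_reeves(g_new, g_old, d_old):
    # Classical Fletcher-Reeves rule, used here only as an example beta_rule.
    return float(g_new @ g_new) / float(g_old @ g_old)

def backtracking(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # Simple Armijo backtracking line search (illustrative, not the paper's).
    fx, slope = f(x), float(grad(x) @ d)
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha
```

For example, nonlinear_cg(f, grad, x0, fletcher_reeves) minimizes a smooth function f from a starting point x0; a parameter derived from a modified secant condition would plug in where beta_rule is called.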


A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization

In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems, which possesses the sufficient descent property with the strong Wolfe-Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...
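As a concrete reading of the strong Wolfe-Powell conditions mentioned above, the Python check below is a minimal sketch; the constants c1 and c2 are illustrative defaults satisfying 0 < c1 < c2 < 1, not values taken from the paper.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    # Check the strong Wolfe-Powell conditions for a trial step alpha along d.
    fx = f(x)
    slope0 = float(grad(x) @ d)                    # directional derivative at x
    x_new = x + alpha * d
    armijo = f(x_new) <= fx + c1 * alpha * slope0              # sufficient decrease
    curvature = abs(float(grad(x_new) @ d)) <= c2 * abs(slope0)  # strong curvature
    return armijo and curvature
```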


A memory gradient method without line search for unconstrained optimization

Memory gradient methods are used for unconstrained optimization, especially large scale problems. The first idea of memory gradient methods was proposed by Miele and Cantrell (1969) and subsequently extended by Cragg and Levy (1969). Recently Narushima and Yabe (2006) proposed a new memory gradient method which generates a descent search direction for the objective function at every iteration a...
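The excerpt does not reproduce the memory gradient parameter rules, so the sketch below only illustrates the general shape of such a direction: the negative gradient combined with a few stored previous directions, with a hypothetical damping weight chosen so the sketch stays descent-like. The Narushima-Yabe parameter choices themselves are not reconstructed here.

```python
import numpy as np
from collections import deque

def memory_gradient_direction(g, past_dirs, weight=0.2):
    # Negative gradient plus a small combination of stored previous directions.
    # `weight` is a hypothetical damping factor for this sketch only; the
    # actual parameter rules of memory gradient methods are not reproduced.
    d = -g.copy()
    m = len(past_dirs)
    for d_prev in past_dirs:
        d += (weight / m) * d_prev
    return d

# Sketch of use inside an iteration (no line search: alpha from a fixed formula):
# history = deque(maxlen=3)
# d = memory_gradient_direction(grad(x), history)
# history.append(d)
# x = x + alpha * d
```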


A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
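The excerpt does not state the new hybrid formula, so as an illustration of how hybrid conjugate gradient parameters are typically built, the sketch below uses the classical Gilbert-Nocedal style combination of the Fletcher-Reeves and Polak-Ribiere rules; this is not the method proposed in the paper.

```python
import numpy as np

def hybrid_beta(g_new, g_old):
    # A classical hybrid conjugacy parameter: clip the Polak-Ribiere value by
    # the Fletcher-Reeves value. Shown only as an example of a hybrid rule.
    denom = float(g_old @ g_old)
    beta_fr = float(g_new @ g_new) / denom                 # Fletcher-Reeves
    beta_pr = float(g_new @ (g_new - g_old)) / denom       # Polak-Ribiere
    return max(-beta_fr, min(beta_pr, beta_fr))
```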


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of line search method, based on eigenvalue analysis. The globa...
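Since the excerpt names the Liu-Storey (LS) parameter but not the extra-term coefficient obtained from the eigenvalue analysis, the Python sketch below shows only the generic three-term shape of such a direction, with theta left as a hypothetical free coefficient (theta = 0 recovers the plain LS direction).

```python
import numpy as np

def three_term_ls_direction(g_new, g_old, d_old, theta=0.0):
    # Generic three-term CG direction built around the Liu-Storey parameter:
    #   d_new = -g_new + beta_ls * d_old + theta * y,  with y = g_new - g_old.
    # `theta` is a hypothetical coefficient; the papers above derive it from
    # eigenvalue analysis, which is not reproduced in this sketch.
    y = g_new - g_old
    beta_ls = float(g_new @ y) / float(-(d_old @ g_old))   # Liu-Storey parameter
    return -g_new + beta_ls * d_old + theta * y
```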



Journal

Journal title: Malaysian Journal of Fundamental and Applied Sciences

Year: 2017

ISSN: 2289-599X, 2289-5981

DOI: 10.11113/mjfas.v0n0.579